Similar resources
Composite Kernel Optimization in Semi-Supervised Metric
Machine-learning solutions to classification, clustering and matching problems critically depend on the adopted metric, which in the past was selected heuristically. In the last decade, it has been demonstrated that an appropriate metric can be learnt from data, resulting in superior performance as compared with traditional metrics. This has recently stimulated a considerable interest in the to...
Kernel Methods for Unsupervised Learning
Kernel Methods are algorithms that project input data into a new space (the Feature Space) via a nonlinear mapping. In this thesis we have investigated Kernel Methods for unsupervised learning, namely Kernel Methods that do not require labelled (target) data. Two classical unsupervised learning problems have been tackled using Kernel Methods: the former is Data Dimensionality Estimation, the latter is the...
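The snippet above describes the core idea of kernel methods: inputs are mapped nonlinearly into a feature space, but only inner products in that space are ever evaluated. A minimal sketch of this kernel trick follows, using scikit-learn's RBF kernel on toy data; the arrays and the gamma value are placeholders and are not taken from the thesis.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

# Toy data: 5 points in 3 dimensions (placeholder, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

# Kernel trick: K[i, j] = <phi(x_i), phi(x_j)> is computed directly from the
# inputs, without ever constructing the (infinite-dimensional) feature-space
# mapping phi induced by the RBF kernel.
K = rbf_kernel(X, gamma=0.5)
print(K.shape)  # (5, 5) symmetric, positive semi-definite Gram matrix
```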
Eigenvoice Speaker Adaptation via Composite Kernel PCA
Eigenvoice speaker adaptation has been shown to be effective when only a small amount of adaptation data is available. At the heart of the method is principal component analysis (PCA), which is employed to find the most important eigenvoices. In this paper, we postulate that nonlinear PCA, in particular kernel PCA, may be even more effective. One major challenge is to map the feature-space eigenvoices ba...
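For context, linear PCA finds eigenvoices directly in the input space, whereas kernel PCA works in a kernel-induced feature space, which is what makes mapping results back to the input space (the pre-image problem) a challenge. A minimal, generic kernel PCA sketch with scikit-learn follows; it is not the paper's composite-kernel construction, the placeholder "supervectors" are random, and fit_inverse_transform is merely one standard way to obtain approximate pre-images.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

# Placeholder speaker "supervectors": 20 speakers x 50 dimensions (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))

# Linear PCA: eigenvoices are the leading principal components in input space.
linear_pca = PCA(n_components=5).fit(X)

# Kernel PCA: components live in the kernel-induced feature space;
# fit_inverse_transform=True learns an approximate pre-image map back to input space.
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=1e-2,
                 fit_inverse_transform=True).fit(X)
Z = kpca.transform(X)               # coordinates along the nonlinear components
X_back = kpca.inverse_transform(Z)  # approximate pre-images in the input space
print(Z.shape, X_back.shape)        # (20, 5) (20, 50)
```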
A composite kernel for named entity recognition
In this paper, we propose a novel kernel function for support vector machines (SVM) that can be used for sequential labeling tasks such as named entity recognition (NER). Machine learning methods such as support vector machines, maximum entropy models, hidden Markov models and conditional random fields are the most widely used approaches for implementing NER systems. The features used in machine learning algori...
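As a rough illustration of the composite-kernel idea (not the kernel proposed in the paper): any convex combination of valid kernels is again a valid kernel, so two Gram matrices can be mixed and passed to an SVM as a precomputed kernel. The features, labels and mixing weight below are placeholders.

```python
import numpy as np
from sklearn.metrics.pairwise import linear_kernel, rbf_kernel
from sklearn.svm import SVC

# Placeholder token features and binary labels (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 10))
y = np.arange(30) % 2

# Composite kernel: a convex combination of a linear and an RBF Gram matrix.
alpha = 0.5
K = alpha * linear_kernel(X) + (1 - alpha) * rbf_kernel(X, gamma=0.1)

clf = SVC(kernel="precomputed").fit(K, y)
print(clf.predict(K[:5]))  # predictions for the first five training examples
```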
Variable Sparsity Kernel Learning
This paper presents novel algorithms and applications for a particular class of Multiple Kernel Learning (MKL) formulations based on mixed-norm regularization. The formulations assume that the given kernels are grouped and employ l1-norm regularization to promote sparsity over the RKHS norms within each group and lq-norm (q ≥ 2) regularization to promote non-sparse combinations across groups. Vario...
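Schematically, the regularizer described above can be written as below, assuming G kernel groups G_j and writing f_k for the component of the decision function in the RKHS H_k; the actual VSKL objective differs in its details.

```latex
% l1-style sum inside each group promotes sparsity within the group;
% the outer lq norm (q >= 2) keeps the combination across groups non-sparse.
\Omega(f) \;=\; \Bigl( \sum_{j=1}^{G} \Bigl( \sum_{k \in \mathcal{G}_j}
    \lVert f_k \rVert_{\mathcal{H}_k} \Bigr)^{q} \Bigr)^{1/q},
\qquad q \ge 2 .
```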
Journal
Journal title: Machine Learning
Year: 2009
ISSN: 0885-6125, 1573-0565
DOI: 10.1007/s10994-009-5150-6